Shannon Entropy Estimation in $\infty$-Alphabets from Convergence Results

Author

  • Jorge F. Silva
Abstract

The problem of Shannon entropy estimation over countably infinite alphabets is revisited from the perspective of convergence results for the entropy functional. Sufficient conditions for the convergence of the entropy are used, covering scenarios with both finitely and infinitely supported distributions. From this angle, four plug-in histogram-based estimators are studied, and strong consistency and rate-of-convergence results are established for distributions with finite but unknown support and for families of distributions satisfying summable tail-bound conditions.
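
For orientation (this is not one of the four estimators analyzed in the paper), the classical plug-in construction that such estimators build on substitutes the empirical distribution of the sample directly into the entropy functional. A minimal sketch, with the function name and the geometric test source as illustrative assumptions:

```python
import numpy as np

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) Shannon entropy estimate, in nats.

    The empirical distribution over the observed symbols is substituted
    into H(p) = -sum_x p(x) * log p(x); unobserved symbols get zero mass,
    so the estimate is finite even for a countably infinite alphabet.
    """
    _, counts = np.unique(samples, return_counts=True)
    p_hat = counts / counts.sum()
    return -np.sum(p_hat * np.log(p_hat))

# Illustrative source: geometric distribution on the infinite alphabet {0, 1, 2, ...}
rng = np.random.default_rng(0)
x = rng.geometric(p=0.3, size=10_000) - 1
print(plugin_entropy(x))
```

Because only observed symbols receive mass, the estimate is always finite even when the true alphabet is countably infinite; the paper's contribution concerns when, and how fast, such estimates converge.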


Related articles

On the Estimation of Shannon Entropy

Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and comparisons are then made with the entropy estimators of Vasicek (1976), van Es (1992), Ebrahimi et al. (1994), and Correa (1995). A simulation st...
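
Among the estimators named above, Vasicek's (1976) spacing-based construction is the easiest to state. A minimal sketch for a continuous sample, with the window choice m ≈ √n as an illustrative default rather than anything prescribed by the article:

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek (1976) spacing estimator of differential entropy, in nats:
    H_hat = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
    with order-statistic indices clamped to [1, n] at the boundaries."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))   # illustrative window choice
    idx = np.arange(n)
    spacings = x[np.minimum(idx + m, n - 1)] - x[np.maximum(idx - m, 0)]
    return float(np.mean(np.log(n / (2.0 * m) * spacings)))

# Standard normal data: the true differential entropy is 0.5*log(2*pi*e) ≈ 1.419 nats
rng = np.random.default_rng(1)
print(vasicek_entropy(rng.standard_normal(5000)))
```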


SHANNON ENTROPY IN ORDER STATISTICS AND THEIR CONCOMITANTS FROM BIVARIATE NORMAL DISTRIBUTION

In this paper, we first derive some results on the Shannon entropy of order statistics and their concomitants arising from a sequence $\{(X_i, Y_i) : i = 1, 2, \ldots\}$ of independent and identically distributed (iid) random variables from the bivariate normal distribution, and extend our results to a collection $C(X, Y) = \{(X_{r_1:n}, Y_{[r_1:n]}), (X_{r_2:n}, Y_{[r_2:n]}), \ldots, (X_{r_k:n}, Y_{[r_k:n]})\}$ of order statistics and th...
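
The snippet does not show the derivations, but the entropy of a single order statistic can at least be checked numerically from its exact density. A small sketch for the univariate (marginal) normal case, which ignores the concomitant and correlation structure the paper treats; the function name and integration grid are assumptions:

```python
import numpy as np
from scipy.stats import norm
from scipy.special import comb

def normal_order_stat_entropy(r, n, grid=np.linspace(-8.0, 8.0, 200_001)):
    """Differential Shannon entropy of X_{r:n}, the r-th order statistic of n
    iid standard normals, computed from the exact density
    f_{r:n}(x) = r * C(n, r) * F(x)**(r-1) * (1 - F(x))**(n-r) * f(x)."""
    F, f = norm.cdf(grid), norm.pdf(grid)
    dens = r * comb(n, r) * F**(r - 1) * (1.0 - F)**(n - r) * f
    safe = np.where(dens > 0, dens, 1.0)        # avoid log(0) where density vanishes
    integrand = -dens * np.log(safe)
    return float(np.trapz(integrand, grid))

# Entropy of the sample median of 5 standard normal observations
print(normal_order_stat_entropy(r=3, n=5))
```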


Universal Weak Variable-Length Source Coding on Countable Infinite Alphabets

Motivated by the fact that universal source coding on countably infinite alphabets is not feasible, this work introduces the notion of "almost lossless source coding". Analogous to the weak variable-length source coding problem studied by Han [3], almost lossless source coding aims at relaxing the lossless block-wise assumption to allow an average per-letter distortion that vanishes asymptotical...
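
To make the relaxation concrete (this is an illustrative toy scheme, not the construction in the cited work): one can encode only the K most probable symbols exactly and collapse all remaining symbols to a single escape symbol, so the per-letter error probability equals the tail mass and vanishes as K grows.

```python
import numpy as np

def escape_scheme_error(pmf, K):
    """Per-letter symbol-error probability when only the K most probable
    symbols are coded losslessly and all others map to one escape symbol."""
    p = np.sort(np.asarray(pmf, dtype=float))[::-1]
    return float(1.0 - p[:K].sum())

# Geometric(0.3) source on {0, 1, 2, ...}, truncated here only for illustration
k = np.arange(400)
pmf = 0.3 * 0.7**k
for K in (4, 8, 16, 32):
    print(K, escape_scheme_error(pmf, K))
```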


Shannon entropy in generalized order statistics from Pareto-type distributions

In this paper, we derive the exact analytical expressions for the Shannon entropy of generalized order statistics from Pareto-type and related distributions.
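
As an illustrative special case only (the ordinary Pareto law itself rather than generalized order statistics from it, which is an assumption of this sketch), the kind of closed form involved can be obtained directly:

```latex
% Differential entropy of X ~ Pareto(\alpha, x_m), with density
% f(x) = \alpha x_m^{\alpha} x^{-(\alpha+1)} for x \ge x_m:
\begin{aligned}
H(X) &= -\int_{x_m}^{\infty} f(x)\,\log f(x)\,\mathrm{d}x
      = -\log\alpha - \alpha\log x_m + (\alpha+1)\,\mathbb{E}[\log X] \\
     &= \log\frac{x_m}{\alpha} + \frac{1}{\alpha} + 1,
\qquad\text{since } \mathbb{E}[\log X] = \log x_m + \frac{1}{\alpha}.
\end{aligned}
```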


Performance comparison of new nonparametric independent component analysis algorithm for different entropic indexes

Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only measure in the literature. In this paper, instead of Shannon entropy, Tsallis entropy is used and a novel ICA algorithm, which uses kernel density estimation (KDE) for estimation of source distributions, is proposed. KDE is di...
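
Not the proposed algorithm itself, but its two stated ingredients can be sketched together: a kernel density estimate of a (here one-dimensional) source density plugged into the continuous Tsallis entropy functional $S_q = \bigl(1 - \int f(x)^q\,dx\bigr)/(q-1)$. The choice q = 2 and scipy's default Gaussian-kernel bandwidth are assumptions of this sketch:

```python
import numpy as np
from scipy.stats import gaussian_kde

def tsallis_entropy_kde(samples, q=2.0, grid_size=4001):
    """Continuous Tsallis entropy S_q = (1 - integral of f^q) / (q - 1),
    with the unknown density f replaced by a Gaussian KDE of the samples."""
    samples = np.asarray(samples, dtype=float)
    kde = gaussian_kde(samples)
    pad = 4.0 * samples.std()
    grid = np.linspace(samples.min() - pad, samples.max() + pad, grid_size)
    f = kde(grid)
    return float((1.0 - np.trapz(f**q, grid)) / (q - 1.0))

# For a standard normal and q = 2, the exact value is 1 - 1/(2*sqrt(pi)) ≈ 0.718
rng = np.random.default_rng(2)
print(tsallis_entropy_kde(rng.standard_normal(2000), q=2.0))
```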



Journal:

Volume   Issue

Pages   -

Publication date: 2017